The Boosted Difference of Convex Functions Algorithm

Vuong Phan (University of Southampton)

09-Jun-2021, 07:00-08:00

Abstract: We introduce a new algorithm for solving Difference of Convex functions (DC) programs, called the Boosted Difference of Convex functions Algorithm (BDCA). BDCA accelerates the convergence of the classical difference of convex functions algorithm (DCA) thanks to an additional line search step. We prove that any limit point of the BDCA iterative sequence is a critical point of the problem under consideration, and that the corresponding sequence of objective values is monotonically decreasing and convergent. Global convergence and a convergence rate for the iterates are obtained under the Kurdyka-Łojasiewicz property. We provide applications and numerical experiments for a hard problem in biochemistry and two challenging problems in machine learning, demonstrating that BDCA outperforms DCA: for the biochemistry problem, BDCA was five times faster than DCA; for the Minimum Sum-of-Squares Clustering problem, BDCA was on average sixteen times faster; and for the Multidimensional Scaling problem, BDCA was three times faster.
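To make the "DCA step plus line search" idea in the abstract concrete, here is a minimal sketch on a toy one-dimensional DC program. Everything here is illustrative and not from the talk: the decomposition f(x) = g(x) - h(x) with g(x) = x⁴ and h(x) = x², the function names, and the line search parameters are all assumptions chosen so the DCA subproblem has a closed form.

```python
import numpy as np

# Toy DC program: f(x) = g(x) - h(x), g(x) = x**4 and h(x) = x**2, both convex.
# The DCA subproblem argmin_y g(y) - h'(x_k) * y reduces to 4 y**3 = h'(x_k),
# so it has the closed-form solution y = cbrt(h'(x_k) / 4).
# All names and parameter values below are illustrative assumptions.

def f(x):
    return x**4 - x**2

def dca_step(x):
    grad_h = 2.0 * x              # h'(x) = 2x
    return np.cbrt(grad_h / 4.0)  # solves 4 y**3 = h'(x)

def bdca(x0, iters=50, lam0=1.0, alpha=0.1, beta=0.5):
    """Classical DCA step followed by a backtracking line search
    along d = y - x, in the spirit of the boosted scheme."""
    x = x0
    for _ in range(iters):
        y = dca_step(x)           # classical DCA point
        d = y - x                 # search direction at y
        lam = lam0
        # backtrack until a sufficient-decrease condition holds
        while lam > 1e-12 and f(y + lam * d) > f(y) - alpha * lam * d**2:
            lam *= beta
        x = y + lam * d           # boosted update (plain DCA if lam ≈ 0)
    return x

x_star = bdca(2.0)
# f has stationary points at 0 and ±1/sqrt(2); from x0 = 2 the iterates
# should approach 1/sqrt(2) ≈ 0.7071.
```

The line search only ever extrapolates past the DCA point y in the direction y - x, which is why the boosted iterates can make strictly more progress per iteration than plain DCA while preserving the monotone decrease of the objective.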

Joint work with Francisco J. Aragón Artacho (University of Alicante, Spain).

optimization and control

Audience: researchers in the topic


Variational Analysis and Optimisation Webinar

Series comments: Register at www.mocao.org/va-webinar/ to receive the Zoom connection details.

Organizers: Hoa Bui*, Matthew Tam*, Minh Dao, Alex Kruger, Vera Roshchina*, Guoyin Li
*contact for this listing
